On Pinsker's and Vajda's Type Inequalities for Csiszár's f-Divergences
Abstract
We study conditions on $f$ under which an $f$-divergence $D_f$ will satisfy $D_f \ge c_f V^2$ or $D_f \ge c_{2,f} V^2 + c_{4,f} V^4$, where $V$ denotes variational distance and the coefficients $c_f$, $c_{2,f}$ and $c_{4,f}$ are best possible. As a consequence, we obtain lower bounds in terms of $V$ for many well-known distance and divergence measures. For instance, let $D_{(\alpha)}(P,Q) = [\alpha(\alpha-1)]^{-1}[\int q^{\alpha} p^{1-\alpha}\, d\mu - 1]$ and $I_\alpha(P,Q) = (\alpha-1)^{-1} \log[\int p^{\alpha} q^{1-\alpha}\, d\mu]$ be respectively the relative information of type $(1-\alpha)$ and Rényi's information gain of order $\alpha$. We show that $D_{(\alpha)} \ge \frac{1}{2} V^2 + \frac{1}{72}(\alpha+1)(2-\alpha) V^4$ whenever $-1 \le \alpha \le 2$, $\alpha \ne 0, 1$, and that $I_\alpha \ge \frac{\alpha}{2} V^2 + \frac{1}{36}\alpha(1 + 5\alpha - 5\alpha^2) V^4$ for $0 < \alpha < 1$. Pinsker's inequality $D \ge \frac{1}{2} V^2$ and its extension $D \ge \frac{1}{2} V^2 + \frac{1}{36} V^4$ are special cases of each of these.
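As a quick illustration (ours, not part of the paper), the first bound can be checked numerically on two-point distributions. The sketch below assumes the discrete form of the definitions above, with $V(P,Q) = \int |p - q|\, d\mu$; the helper names are hypothetical.

```python
# Minimal numerical sanity check (illustrative, not from the paper) of
#   D_(alpha) >= V^2/2 + (alpha + 1)(2 - alpha) V^4 / 72,  -1 <= alpha <= 2,
# on Bernoulli distributions.
import numpy as np

def variational_distance(p, q):
    # V(P,Q) = integral |p - q| d(mu); an L1 sum in the discrete case.
    return np.sum(np.abs(p - q))

def d_alpha(p, q, alpha):
    # D_(alpha)(P,Q) = [alpha(alpha-1)]^{-1} [ sum q^alpha p^(1-alpha) - 1 ]
    return (np.sum(q**alpha * p**(1 - alpha)) - 1) / (alpha * (alpha - 1))

rng = np.random.default_rng(0)
for _ in range(10_000):
    a, b = rng.uniform(0.01, 0.99, size=2)
    p, q = np.array([a, 1 - a]), np.array([b, 1 - b])
    alpha = rng.uniform(-1, 2)
    if min(abs(alpha), abs(alpha - 1)) < 1e-3:
        continue  # the closed form excludes alpha = 0, 1 (KL-divergence limits)
    V = variational_distance(p, q)
    lower = 0.5 * V**2 + (alpha + 1) * (2 - alpha) * V**4 / 72
    assert d_alpha(p, q, alpha) >= lower - 1e-12
print("bound held on all sampled cases")
```

Note that at $\alpha \to 1$ the coefficient $\frac{1}{72}(\alpha+1)(2-\alpha)$ equals $\frac{1}{36}$, recovering the stated extension of Pinsker's inequality.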
Similar Resources
On Pinsker's Type Inequalities and Csiszar's f-divergences. Part I: Second and Fourth-Order Inequalities
We study conditions on $f$ under which an $f$-divergence $D_f$ will satisfy $D_f \ge c_f V^2$ or $D_f \ge c_{2,f} V^2 + c_{4,f} V^4$, where $V$ denotes variational distance and the coefficients $c_f$, $c_{2,f}$ and $c_{4,f}$ are best possible. As a consequence, we obtain lower bounds in terms of $V$ for many well-known distance and divergence measures. For instance, let $D_{(\alpha)}(P,Q) = [\alpha(\alpha-1)]^{-1}[\int q^{\alpha} p^{1-\alpha}\, d\mu - 1]$ and $I_\alpha(P,Q) = (\alpha-1)^{-1} \log[\int \ldots$
Some inequalities for information divergence and related measures of discrimination
Inequalities which connect information divergence with other measures of discrimination or distance between probability distributions are used in information theory and its applications to mathematical statistics, ergodic theory and other scientific fields. We suggest new inequalities of this type, often based on underlying identities. As a consequence we obtain certain improvements of the well...
Information, Divergence and Risk for Binary Experiments
We unify f-divergences, Bregman divergences, surrogate loss bounds (regret bounds), proper scoring rules, matching losses, cost curves, ROC curves and information. We do this by systematically studying integral and variational representations of these objects, and in so doing identify their primitives, all of which are related to cost-sensitive binary classification. As well as clarifying relations...
f-Divergences and Related Distances
Derivation of tight bounds on f-divergences and related distances is of interest in information theory and statistics. This paper improves some existing bounds on f-divergences. In some cases, an alternative approach leads to a simplified proof of an existing bound. Following bounds on the chi-squared divergence, an improved version of a reversed Pinsker's inequality is derived for an arbitra...
Relative information of type s, Csiszár's f-divergence, and information inequalities
Over the past years Dragomir has contributed a substantial body of work providing different kinds of bounds on distance, information and divergence measures. In this paper, we unify some of his results using the relative information of type s and relate it to Csiszár's f-divergence.
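For context, the link between these quantities can be made explicit. A short verification (ours, not from the cited paper), assuming the standard Csiszár form $D_f(P,Q) = \int p\, f(q/p)\, d\mu$, shows that the relative information defined in the abstract above is the f-divergence generated by $f_\alpha(u) = [\alpha(\alpha-1)]^{-1}(u^\alpha - 1)$:

```latex
% Assuming D_f(P,Q) = \int p\, f(q/p)\, d\mu and taking
% f_\alpha(u) = [\alpha(\alpha-1)]^{-1}\,(u^\alpha - 1),
% and using \int p\, d\mu = 1 in the second step:
\[
  D_{f_\alpha}(P,Q)
  = \int p\,\frac{(q/p)^{\alpha} - 1}{\alpha(\alpha-1)}\,d\mu
  = \frac{1}{\alpha(\alpha-1)}\Bigl[\int q^{\alpha}p^{1-\alpha}\,d\mu - 1\Bigr]
  = D_{(\alpha)}(P,Q).
\]
```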
Journal: IEEE Trans. Information Theory
Volume: 56
Published: 2010